The Expectation Maximization (EM) algorithm

Author

  • Max Welling
Abstract

In the previous class we already mentioned that many of the most powerful probabilistic models contain hidden variables. We will denote these variables with y. It is usually also the case that these models are most easily written in terms of their joint density,

    p(d, y, θ) = p(d|y, θ) p(y|θ) p(θ)                          (1)

Remember also that the objective function we want to maximize is the log-likelihood (possibly including the prior term, as in MAP estimation), given by

    L(d, θ) = log[p(d|θ)] + log[p(θ)]                           (2)
            = log[∫ dy p(d, y|θ)] + log[p(θ)]                   (3)
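Equation (3), the incomplete-data log-likelihood obtained by integrating out y, can be made concrete with a small sketch. The following is an illustrative example, not from the notes: a two-component 1-D Gaussian mixture where the latent y is the component index, so the integral over y becomes a sum over components.

```python
import math

# Hypothetical toy model: a two-component 1-D Gaussian mixture.
# The latent variable y ∈ {0, 1} selects the component;
# θ = (mixture weights, component means, shared standard deviation).
def log_likelihood(data, weights, means, std):
    """log p(d|θ) = Σ_i log Σ_y p(y|θ) p(d_i|y, θ)  -- the marginal in Eq. (3)."""
    total = 0.0
    for x in data:
        # Sum over the latent variable y (the "∫ dy" becomes a finite sum).
        marginal = sum(
            w * math.exp(-0.5 * ((x - m) / std) ** 2) / (std * math.sqrt(2 * math.pi))
            for w, m in zip(weights, means)
        )
        total += math.log(marginal)
    return total

data = [-2.1, -1.9, 2.0, 1.8]
# Parameters matching the data's two clusters give a higher log-likelihood
# than a mismatched setting -- this is the quantity EM tries to increase.
good = log_likelihood(data, [0.5, 0.5], [-2.0, 2.0], 1.0)
bad = log_likelihood(data, [0.5, 0.5], [0.0, 0.0], 1.0)
```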


Similar references

An Explanation of the Expectation Maximization Algorithm, Report no. LiTH-ISY-R-2915

The expectation maximization (EM) algorithm computes maximum likelihood estimates of unknown parameters in probabilistic models involving latent variables. More pragmatically speaking, the EM algorithm is an iterative method that alternates between computing a conditional expectation and solving a maximization problem, hence the name expectation maximization. We will in this work derive the EM ...
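The alternation this abstract describes — a conditional expectation step followed by a maximization step — can be sketched for a simple case. This is an illustrative toy implementation for a two-component 1-D Gaussian mixture with fixed unit variances (an assumption made here for brevity, not the report's derivation):

```python
import math

def normal_pdf(x, mean):
    # Unit-variance Gaussian density (a simplifying assumption of this sketch).
    return math.exp(-0.5 * (x - mean) ** 2) / math.sqrt(2 * math.pi)

def em_step(data, weights, means):
    # E-step: responsibilities r[i][k] = p(y = k | d_i, θ),
    # the conditional expectation of the latent variable given the data.
    resp = []
    for x in data:
        joint = [w * normal_pdf(x, m) for w, m in zip(weights, means)]
        z = sum(joint)
        resp.append([j / z for j in joint])
    # M-step: re-estimate θ by maximizing the expected
    # complete-data log-likelihood (closed form for a Gaussian mixture).
    n_k = [sum(r[k] for r in resp) for k in range(2)]
    new_weights = [n / len(data) for n in n_k]
    new_means = [sum(r[k] * x for r, x in zip(resp, data)) / n_k[k]
                 for k in range(2)]
    return new_weights, new_means

data = [-2.2, -1.8, -2.0, 1.9, 2.1, 2.0]
weights, means = [0.5, 0.5], [-1.0, 1.0]   # rough starting guess
for _ in range(20):
    weights, means = em_step(data, weights, means)
```

Each iteration is guaranteed not to decrease the incomplete-data log-likelihood, which is the key property behind EM's convergence.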


On Regularization Methods of Em-kaczmarz Type

We consider regularization methods of Kaczmarz type in connection with the expectation-maximization (EM) algorithm for solving ill-posed equations. For noisy data, our methods are stabilized extensions of the well established ordered-subsets expectation-maximization iteration (OS-EM). We show monotonicity properties of the methods and present a numerical experiment which indicates that the exte...


An Improved EM algorithm

In this paper, we first give a brief introduction to the expectation maximization (EM) algorithm, and then discuss its sensitivity to initial values. Subsequently, we give a short proof of EM's convergence. We then run experiments with the expectation maximization algorithm (all experiments use a Gaussian mixture model (GMM)). Our experiment...
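The initial-value sensitivity mentioned above has a simple visible form even in a toy setting. The sketch below (illustrative, not the paper's experiment) runs EM for the means of a two-component, unit-variance 1-D Gaussian mixture from two different starting points; the recovered components come out in a different order, showing that the fixed point EM reaches depends on where it starts:

```python
import math

def em_means(data, means, iters=15):
    # EM restricted to the component means, with equal weights and unit
    # variances held fixed (a simplifying assumption of this sketch).
    means = list(means)
    for _ in range(iters):
        # E-step: responsibilities under the current means.
        resp = []
        for x in data:
            p = [math.exp(-0.5 * (x - m) ** 2) for m in means]
            z = sum(p)
            resp.append([pi / z for pi in p])
        # M-step: responsibility-weighted means.
        means = [
            sum(r[k] * x for r, x in zip(resp, data)) / sum(r[k] for r in resp)
            for k in range(2)
        ]
    return means

data = [-3.1, -2.9, -3.0, 2.9, 3.1, 3.0]
a = em_means(data, [-1.0, 1.0])   # component 0 captures the left cluster
b = em_means(data, [1.0, -1.0])   # swapped start: components trade places
```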


The basic idea behind Expectation-Maximization

3 The Expectation-Maximization algorithm
3.1 Jointly-non-concave incomplete log-likelihood
3.2 (Possibly) Concave complete data log-likelihood
3.3 The general EM derivation
3.4 The E & M-steps
3.5 The EM algorithm ...



Noisy Expectation-Maximization: Applications and Generalizations

We present a noise-injected version of the Expectation-Maximization (EM) algorithm: the Noisy Expectation Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that injected noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. The gener...




Publication date: 2004